Supervised Deep Sparse Coding Networks

Authors

  • Xiaoxia Sun
  • Nasser M. Nasrabadi
  • Trac D. Tran
Abstract

In this paper, we propose a novel multilayer sparse coding network capable of efficiently adapting its own regularization parameters to a given dataset. The network is trained end-to-end with a supervised task-driven learning algorithm via error backpropagation. During training, the network learns both the dictionaries and the regularization parameters of each sparse coding layer so that the reconstructive dictionaries are smoothly transformed into increasingly discriminative representations. We also incorporate a new weighted sparse coding scheme into our sparse recovery procedure, offering the system more flexibility to adjust sparsity levels. Furthermore, we have devised a sparse coding layer utilizing a 'skinny' dictionary. Integral to computational efficiency, these skinny dictionaries compress the high-dimensional sparse codes into lower-dimensional structures. The adaptivity and discriminability of our 13-layer sparse coding network are demonstrated on four benchmark datasets, namely Cifar-10, Cifar-100, SVHN and MNIST, most of which are considered difficult for sparse coding models. Experimental results show that our architecture substantially outperforms traditional one-layer sparse coding architectures while using far fewer parameters. Moreover, our multilayer architecture fuses the benefits of depth with sparse coding's characteristic ability to operate on smaller datasets. In such data-constrained scenarios, we demonstrate that our technique can overcome the limitations of deep neural networks by exceeding the state of the art in accuracy.
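To make the weighted sparse coding idea concrete, the following is a minimal sketch of a single sparse coding layer solved by ISTA with a per-atom regularization vector, which is the quantity the paper learns jointly with the dictionary. This is an illustrative reconstruction, not the authors' implementation; the names `soft_threshold` and `sparse_coding_layer` are hypothetical, and the solver, iteration count, and initialization are assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    # Element-wise soft-thresholding: the proximal operator of the
    # (weighted) L1 norm; tau may be a scalar or a per-atom vector.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def sparse_coding_layer(x, D, lam, n_iter=50):
    """Solve min_z 0.5*||x - D z||^2 + sum_j lam_j * |z_j| via ISTA.

    x   : input signal, shape (m,)
    D   : dictionary, shape (m, k)
    lam : per-atom regularization weights, shape (k,) -- in the paper
          these are learned end-to-end by backpropagation.
    """
    # Step size 1/L, where L is the Lipschitz constant of the gradient
    # of the data term (largest squared singular value of D).
    L = np.linalg.norm(D, ord=2) ** 2
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)          # gradient of 0.5*||x - D z||^2
        z = soft_threshold(z - grad / L, lam / L)
    return z
```

Because each ISTA step is differentiable in both `D` and `lam` (away from the thresholding kinks), unrolling a fixed number of iterations lets gradients flow back to the regularization weights, which is what allows each layer's sparsity level to adapt to the dataset.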


Related articles

Sparse Coding on Stereo Video for Object Detection

Deep Convolutional Neural Networks (DCNN) require millions of labeled training examples for image classification and object detection tasks, which restrict these models to domains where such datasets are available. In this paper, we explore the use of unsupervised sparse coding applied to stereo-video data to help alleviate the need for large amounts of labeled data. We show that replacing a ty...


The Role of Architectural and Learning Constraints in Neural Network Models: A Case Study on Visual Space Coding

The recent "deep learning revolution" in artificial neural networks had strong impact and widespread deployment for engineering applications, but the use of deep learning for neurocomputational modeling has been so far limited. In this article we argue that unsupervised deep learning represents an important step forward for improving neurocomputational models of perception and cognition, becaus...


Deep Sparse Rectifier Neural Networks

Rectifying neurons are more biologically plausible than logistic sigmoid neurons, which are themselves more biologically plausible than hyperbolic tangent neurons. However, the latter work better for training multi-layer neural networks than logistic sigmoid neurons. This paper shows that networks of rectifying neurons yield equal or better performance than hyperbolic tangent networks in spite ...


Hashing by Deep Learning

During the past decade (since around 2006), Deep Learning [7], also known as Deep Neural Networks, has drawn increasing attention and research efforts in a variety of artificial intelligence areas including speech recognition, computer vision, machine learning, text mining, etc. Since one main purpose of deep learning is to learn robust and powerful feature representations for complex data, it ...


Deep Rewiring: Training very sparse deep networks

Neuromorphic hardware tends to pose limits on the connectivity of deep networks that one can run on them. But also generic hardware and software implementations of deep learning run more efficiently on sparse networks. Several methods exist for pruning connections of a neural network after it was trained without connectivity constraints. We present an algorithm, DEEP R, that enables us to train...




Publication date: 2017